A general backpropagation algorithm for feedforward neural networks learning

Authors

  • Xinghuo Yu
  • Mehmet Önder Efe
  • Okyay Kaynak
Abstract

A general backpropagation algorithm is proposed for feedforward neural network learning with time-varying inputs. The Lyapunov function approach is used to rigorously analyze the convergence of the weights toward minima of the error function under the algorithm. Sufficient conditions guaranteeing the convergence of the weights for time-varying inputs are derived. It is shown that the most commonly used backpropagation learning algorithms are special cases of the developed general algorithm.
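The standard gradient-descent backpropagation rule that the paper subsumes as a special case updates each weight against the gradient of the error function, w ← w − η ∂E/∂w. A minimal sketch of that textbook special case (not the paper's general algorithm, and with illustrative sample data chosen here) for a single sigmoid unit:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(samples, eta=0.5, epochs=200):
    """Gradient-descent backpropagation for one sigmoid neuron,
    minimizing the squared error E = 0.5 * (y - t)^2 per sample."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = sigmoid(w * x + b)
            # Chain rule: dE/dw = (y - t) * y * (1 - y) * x
            delta = (y - t) * y * (1.0 - y)
            w -= eta * delta * x
            b -= eta * delta
    return w, b

# Illustrative data: learn to output ~1 for x = 2 and ~0 for x = -2.
w, b = train([(2.0, 1.0), (-2.0, 0.0)])
print(sigmoid(w * 2.0 + b), sigmoid(w * -2.0 + b))
```

The convergence result in the paper concerns exactly this kind of iteration: under the derived sufficient conditions, the weight sequence approaches a minimum of the error function even when the inputs vary over time.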


Related articles

A backpropagation learning framework for feedforward neural networks

In this paper, a general backpropagation learning framework for the training of feedforward neural networks is proposed. Convergence to the global minimum under the framework is investigated using Lyapunov stability theory. It is shown that existing feedforward neural network training algorithms are special cases of the proposed framework.


Merging Echo State and Feedforward Neural Networks for Time Series Forecasting

Echo state neural networks, which are a special case of recurrent neural networks, are studied from the viewpoint of their learning ability, with a goal to achieve their greater prediction ability. A standard training of these neural networks uses pseudoinverse matrix for one-step learning of weights from hidden to output neurons. Such learning was substituted by backpropagation of error learni...


Optimization-based learning with bounded error for feedforward neural networks

An optimization-based learning algorithm for feedforward neural networks is presented, in which the network weights are determined by minimizing a sliding-window cost. The algorithm is particularly well suited for batch learning and allows one to deal with large data sets in a computationally efficient way. An analysis of its convergence and robustness properties is made. Simulation results con...


Supervised Scaled Regression Clustering: An Alternative to Neural Networks

This paper describes a novel method for the supervised training of regression systems that can serve as an alternative to feedforward Artificial Neural Networks (ANNs) trained with the backpropagation algorithm. The proposed methodology is a hybrid structure based on supervised clustering with genetic algorithms and local learning. Supervised Scaled Regression Clustering with Genetic Algorithm...


Optimized Learning with Bounded Error for Feedforward Neural Networks

A learning algorithm for feedforward neural networks is presented that is based on a parameter estimation approach. The algorithm is particularly well-suited for batch learning and allows one to deal with large data sets in a computationally efficient way. An analysis of its convergence and robustness properties is made. Simulation results confirm the effectiveness of the algorithm and its adva...



Journal:
  • IEEE Transactions on Neural Networks

Volume 13, Issue 1

Pages: -

Publication date: 2002